A Robbins-Monro type learning algorithm for an entropy maximizing version of stochastic Optimality Theory

Author

  • Markus Fischer
Abstract

The object of the present work is the analysis of the convergence behaviour of a learning algorithm for grammars belonging to a special version – the maximum entropy version – of stochastic Optimality Theory. Stochastic Optimality Theory is like its deterministic predecessor, namely Optimality Theory as introduced by Prince and Smolensky, in that both are theories of universal grammar in the sense of generative linguistics. We give formal definitions of basic notions of stochastic Optimality Theory and introduce the learning problem as it appears in this context. A by now popular version of stochastic Optimality Theory is the one introduced by Boersma, which we briefly discuss. The maximum entropy version of stochastic Optimality Theory is derived in great generality from fundamental principles of information theory. The main part of this work is dedicated to the analysis of a learning algorithm proposed by Jäger (2003) for maximum entropy grammars. We show in which sense and under what conditions the algorithm converges. Connections with well known procedures and classical results are made explicit.

The present work is a slightly modified version of my Master's thesis, which was submitted to the Department of German Language and Linguistics at Humboldt University Berlin in June 2005. The thesis was supervised by Prof. Manfred Krifka and Prof. Gerhard Jäger.

Author's address: Markus Fischer, Reichenberger Str. 166, 10999 Berlin, Germany. E-Mail: [email protected]
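To make the object of study concrete, the following Python sketch shows a Robbins-Monro style stochastic gradient update of the constraint weights of a maximum entropy grammar, in the spirit of the algorithm analysed here. It is a simplified illustration under assumptions of my own (a fixed finite candidate set, numeric violation profiles, step sizes 1/n), not the exact formulation of Jäger (2003) or of the thesis.

import math
import random

# Minimal sketch (toy setup): a candidate is represented by its tuple of
# constraint violation counts; the grammar assigns each candidate a probability
# proportional to exp(-sum_i w_i * violations_i).

def candidate_probs(weights, candidates):
    scores = [-sum(w * v for w, v in zip(weights, cand)) for cand in candidates]
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

def update(weights, candidates, observed_idx, step):
    # One Robbins-Monro step: move each weight by the difference between the
    # expected violation count under the current grammar and the violation
    # count of the observed winner (the stochastic gradient of the
    # log-likelihood of the observed candidate).
    probs = candidate_probs(weights, candidates)
    expected = [sum(p * cand[i] for p, cand in zip(probs, candidates))
                for i in range(len(weights))]
    observed = candidates[observed_idx]
    return [w + step * (e - o) for w, e, o in zip(weights, expected, observed)]

# Toy learning data: two constraints, three candidates; candidate 1 is the
# observed winner 70% of the time, candidate 0 the remaining 30%.
candidates = [(0, 2), (1, 0), (2, 1)]
weights = [0.0, 0.0]
for n in range(1, 5001):
    observed_idx = 1 if random.random() < 0.7 else 0
    weights = update(weights, candidates, observed_idx, step=1.0 / n)
print(weights)

The step sizes 1/n satisfy the usual Robbins-Monro conditions (they sum to infinity while their squares have a finite sum), which is the type of condition under which stochastic approximation convergence results apply.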


Similar articles

Joint Stochastic Approximation learning of Helmholtz Machines

Despite recent progress, model learning and posterior inference remain a common challenge in using deep generative models, especially those with discrete hidden variables. This paper is mainly concerned with algorithms for learning Helmholtz machines, which are characterized by pairing the generative model with an auxiliary inference model. A common drawback of previous learning...


Urn Models and Differential Algebraic Equations (Pre-publicaciones del Seminario Matemático, 2002)

A generalised urn model is presented in this paper. The urn contains L different types of balls and its replacement policy depends on both an urn function and a random environment. We consider the L-dimensional stochastic process {Xn} that represents the proportion of balls of each type in the urn after each replacement. This process can be expressed as a stochastic recurrent equation that fits ...


Convergence of a Robbins-Monro Algorithm for Recursive Parameter Estimation with Non-Monotone Weights and Multiple Zeros (Bureau of the Census, Statistical Research Division, Statistical Research Report Series No. RR2001/02)

Convergence properties are established for the output of a deterministic Robbins-Monro recursion for functions that can have singularities and multiple zeros. Our analysis is built largely on adaptations of lemmas of Fradkov published in Russian. We present versions of these lemmas in English for the first time. A gap in Fradkov’s proof of the final lemma is fixed but only for the scalar case.
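For readers unfamiliar with the recursion in question, here is a small Python illustration of a deterministic Robbins-Monro type iteration x_{n+1} = x_n - a_n f(x_n) with non-monotone gains and a function having several zeros; the particular f and gain sequence are invented for the example and are not taken from the report.

# Toy example (the choice of f and gains is mine, not the report's).
def robbins_monro(f, x0, gains, n_iter=10000):
    # Iterate x_{n+1} = x_n - a_n * f(x_n) and return the final iterate.
    x = x0
    for n in range(1, n_iter + 1):
        x = x - gains(n) * f(x)
    return x

# f has zeros at -1, 0 and 1.
f = lambda x: x * (x - 1.0) * (x + 1.0)
# Non-monotone gains that still satisfy the usual summability conditions.
gains = lambda n: (1.0 + 0.5 * (-1) ** n) / n
print(robbins_monro(f, x0=0.6, gains=gains))  # settles near the zero at 1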


Convergence of a Stochastic Approximation Algorithm for the GI/G/1 Queue Using Infinitesimal Perturbation Analysis

Discrete-event systems to which the technique of infinitesimal perturbation analysis (IPA) is applicable are natural candidates for optimization via a Robbins-Monro type stochastic approximation algorithm. We establish a simple framework for single-run optimization of systems with regenerative structure. The main idea is to convert the original problem into one in which unbiased estimators can ...




Publication date: 2005